Multiplicative Updates for Elastic Net Regularized Convolutional NMF Under $\beta$-Divergence
Authors
Abstract
We generalize convolutional NMF by taking the β-divergence as the loss function, add an elastic-net regularizer to promote sparsity, and provide multiplicative update rules for its factors in closed form. The new update rules contain the β-NMF, the standard convolutional NMF, and sparse coding (also known as basis pursuit) as special cases. We demonstrate that the originally published update rules for the convolutional NMF are suboptimal and that their convergence rate depends on the size of the convolution kernel.
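The β-NMF special case that the abstract says is embedded in the new rules can be illustrated with a standard multiplicative update in which the elastic-net terms enter the denominator. This is an illustrative sketch of that plain (non-convolutional) special case, not the paper's convolutional algorithm; the function name and the penalty weights `l1`, `l2` are assumptions:

```python
import numpy as np

def beta_nmf_update_H(V, W, H, beta=1.0, l1=0.0, l2=0.0, eps=1e-12):
    """One multiplicative update of the activations H for beta-NMF with an
    elastic-net penalty l1*||H||_1 + (l2/2)*||H||_F^2.
    Sketch only: V ~ W @ H, all entries nonnegative."""
    Vhat = W @ H + eps                          # current reconstruction
    numer = W.T @ (Vhat ** (beta - 2.0) * V)    # gradient's negative part
    denom = W.T @ (Vhat ** (beta - 1.0)) + l1 + l2 * H + eps
    return H * numer / denom                    # stays nonnegative by construction

# usage: factorize a random nonnegative matrix under the KL loss (beta = 1)
rng = np.random.default_rng(0)
V = rng.random((20, 30))
W = rng.random((20, 5))
H = rng.random((5, 30))
for _ in range(200):
    H = beta_nmf_update_H(V, W, H, beta=1.0, l1=0.01, l2=0.01)
```

Because the update is a ratio of nonnegative terms, nonnegativity of `H` is preserved at every step, and the l1/l2 terms in the denominator shrink the activations toward sparsity.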
Similar resources
Algorithms for nonnegative matrix factorization with the beta-divergence
This paper describes algorithms for nonnegative matrix factorization (NMF) with the β-divergence (β-NMF). The β-divergence is a family of cost functions parametrized by a single shape parameter β that takes the Euclidean distance, the Kullback-Leibler divergence and the Itakura-Saito divergence as special cases (β = 2, 1, 0 respectively). The proposed algorithms are based on a surrogate auxilia...
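The three special cases of the β-divergence named above (Euclidean, Kullback-Leibler, Itakura-Saito) follow directly from the standard elementwise definition. The helper below (a hypothetical name, not from the paper) writes it out with the usual case split at β = 0 and β = 1:

```python
import numpy as np

def beta_divergence(x, y, beta):
    """Elementwise beta-divergence d_beta(x | y), summed over all entries.
    beta = 2: half squared Euclidean distance; beta = 1: generalized KL;
    beta = 0: Itakura-Saito. x, y must be strictly positive for beta <= 1."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if beta == 0:   # Itakura-Saito divergence
        return np.sum(x / y - np.log(x / y) - 1.0)
    if beta == 1:   # generalized Kullback-Leibler divergence
        return np.sum(x * np.log(x / y) - x + y)
    # general case; for beta = 2 this reduces to 0.5 * ||x - y||^2
    return np.sum((x ** beta + (beta - 1.0) * y ** beta
                   - beta * x * y ** (beta - 1.0)) / (beta * (beta - 1.0)))
```

For example, `beta_divergence([3.0], [1.0], 2)` gives `(3 - 1)**2 / 2 = 2.0`, and the divergence vanishes whenever `x == y` for any β.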
Efficient Multiplicative Updates for Support Vector Machines
The dual formulation of the support vector machine (SVM) objective function is an instance of a nonnegative quadratic programming problem. We reformulate the SVM objective function as a matrix factorization problem which establishes a connection with the regularized nonnegative matrix factorization (NMF) problem. This allows us to derive a novel multiplicative algorithm for solving hard and sof...
MahNMF: Manhattan Non-negative Matrix Factorization
Non-negative matrix factorization (NMF) approximates a non-negative matrix X by a product of two non-negative low-rank factor matrices W and H. NMF and its extensions minimize either the Kullback-Leibler divergence or the Euclidean distance between X and WH to model Poisson noise or Gaussian noise, respectively. In practice, when the noise distribution is heavy-tailed, they cannot perform well. This...
Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization
We propose a class of multiplicative algorithms for Nonnegative Matrix Factorization (NMF) which are robust with respect to noise and outliers. To achieve this, we formulate a new family of generalized divergences referred to as the Alpha-Beta-divergences (AB-divergences), which are parameterized by two tuning parameters, alpha and beta, and smoothly connect the fundamental Alpha-, Beta- and Gam...
Journal:
Volume, issue:
Pages: -
Published: 2018